feat: add MiniMax as first-class LLM provider#300

Open
octo-patch wants to merge 1 commit into algorithmicsuperintelligence:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add MiniMax AI as a first-class LLM provider alongside OpenAI, Cerebras, and Azure OpenAI
  • MiniMax is detected via MINIMAX_API_KEY env var and uses its OpenAI-compatible API at https://api.minimax.io/v1
  • Temperature values are auto-clamped to MiniMax's valid range (0, 1] to prevent API errors

Changes

Provider Detection (optillm/server.py)

  • Added MINIMAX_API_KEY check in get_config() between Cerebras and OpenAI in priority order
  • Creates an OpenAI client with MiniMax base URL (https://api.minimax.io/v1) by default
  • Supports custom base_url override via --base-url flag
  • Properly passes SSL/httpx configuration to the client
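The detection step described above can be sketched roughly as follows. This is a minimal illustration only: the `detect_minimax` helper and the config-dict shape are hypothetical stand-ins, not the actual `get_config()` code in this PR, which builds a real OpenAI SDK client and threads through SSL/httpx settings.

```python
import os

MINIMAX_BASE_URL = "https://api.minimax.io/v1"

def detect_minimax(env, base_url_override=None):
    """Return MiniMax client settings if MINIMAX_API_KEY is set, else None.

    Returning None lets the caller fall through to the next provider
    in the priority order.
    """
    api_key = env.get("MINIMAX_API_KEY")
    if api_key is None:
        return None
    return {
        "api_key": api_key,
        # --base-url wins over the default MiniMax endpoint
        "base_url": base_url_override or MINIMAX_BASE_URL,
    }
```

In the real server, the returned settings would be passed to the OpenAI SDK client constructor, since MiniMax's API is OpenAI-compatible.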

Temperature Clamping (optillm/server.py)

  • Clamps temperature to 0.01 when <= 0 (MiniMax requires strictly positive temperature)
  • Clamps temperature to 1.0 when > 1.0
  • Only applies when MINIMAX_API_KEY is set; other providers are unaffected
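The clamping rules above amount to a small helper along these lines (an illustrative sketch; the function name is hypothetical, and in the PR the logic lives inline in optillm/server.py, gated on `MINIMAX_API_KEY` being set):

```python
def clamp_minimax_temperature(temperature: float) -> float:
    """Clamp a sampling temperature into MiniMax's valid range (0, 1]."""
    if temperature <= 0:
        return 0.01  # MiniMax requires a strictly positive temperature
    if temperature > 1.0:
        return 1.0
    return temperature
```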

Documentation (README.md)

  • Added MiniMax row to the provider table with API key, link, and notes

Tests

  • 17 unit tests (tests/test_minimax_provider.py): provider detection, base URL, SSL, priority ordering, temperature clamping
  • 3 integration tests (tests/test_minimax_integration.py): basic completion, temperature boundary, streaming (requires live API key)

Provider Priority Order

  1. OPTILLM_API_KEY - Local inference
  2. CEREBRAS_API_KEY - Cerebras
  3. MINIMAX_API_KEY - MiniMax (new)
  4. OPENAI_API_KEY - OpenAI
  5. AZURE_OPENAI_API_KEY - Azure OpenAI
  6. LiteLLM fallback
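The priority order reads as a first-match-wins scan over environment variables. A minimal sketch, assuming a hypothetical `pick_provider` helper (the real selection happens inside `get_config()`):

```python
def pick_provider(env):
    """Return the first provider whose API key is present, mirroring the order above."""
    order = [
        ("OPTILLM_API_KEY", "local"),
        ("CEREBRAS_API_KEY", "cerebras"),
        ("MINIMAX_API_KEY", "minimax"),
        ("OPENAI_API_KEY", "openai"),
        ("AZURE_OPENAI_API_KEY", "azure"),
    ]
    for key, name in order:
        if env.get(key):
            return name
    return "litellm"  # fallback when no direct provider key is set
```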

Usage

export MINIMAX_API_KEY="your-key"
optillm

# Then use with any approach:
# model="moa-MiniMax-M2.7" or model="bon-MiniMax-M2.7"

Test plan

  • All 17 unit tests pass
  • All 3 integration tests pass with live MiniMax API
  • Verify no regressions in existing provider detection (OpenAI, Cerebras, Azure)


Add MiniMax AI (https://www.minimax.io/) as a directly supported LLM provider
alongside OpenAI, Cerebras, and Azure OpenAI. MiniMax's API is OpenAI-compatible,
so this uses the OpenAI SDK with MiniMax's base URL for seamless integration.

Changes:
- Add MINIMAX_API_KEY detection in get_config() with auto base URL
- Add temperature clamping for MiniMax: values are clamped to (0, 1]
- Update README provider table with MiniMax documentation
- Add 17 unit tests covering provider detection, priority, and temp clamping
- Add 3 integration tests for live API verification
